On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions

Author

  • José Yunier Bello Cruz
Abstract

In this paper we present a variant of the proximal forward-backward splitting method for solving nonsmooth optimization problems in Hilbert spaces, when the objective function is the sum of two nondifferentiable convex functions. The proposed iteration, which we call the Proximal Subgradient Splitting Method, extends the classical projected subgradient iteration to important classes of problems by exploiting the additive structure of the objective function. The weak convergence of the generated sequence is established for several stepsize rules and under suitable assumptions. Moreover, we analyze the complexity of the iterates.
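For readers who want a concrete picture of such an iteration, the sketch below combines a forward subgradient step on one nondifferentiable term with a backward proximal step on the other, which is the general shape of scheme the abstract describes. The stepsize rule, the example objective (an l1 regression term plus an l1 regularizer), and the helper names subgrad_f and prox_g are illustrative assumptions, not the paper's exact algorithm.

import itertools
import numpy as np

def proximal_subgradient_splitting(subgrad_f, prox_g, x0, stepsizes, n_iter=500):
    # subgrad_f(x): returns some subgradient of the nondifferentiable convex term f at x
    # prox_g(x, a): returns prox_{a*g}(x) for the other nondifferentiable convex term g
    # stepsizes:    iterable of stepsizes alpha_k (an assumed rule; the paper studies several)
    x = np.asarray(x0, dtype=float)
    for _, alpha in zip(range(n_iter), stepsizes):
        u = subgrad_f(x)                    # forward step: subgradient of f
        x = prox_g(x - alpha * u, alpha)    # backward step: proximity operator of g
    return x

# Illustrative instance (assumed, not from the paper): f(x) = ||Ax - b||_1, g(x) = lam*||x||_1.
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 5)), rng.standard_normal(20), 0.1
subgrad_f = lambda x: A.T @ np.sign(A @ x - b)                          # a subgradient of ||Ax - b||_1
prox_g = lambda x, a: np.sign(x) * np.maximum(np.abs(x) - a * lam, 0)   # soft-thresholding = prox of a*lam*||.||_1
stepsizes = (1.0 / np.sqrt(k + 1) for k in itertools.count())           # a divergent-series stepsize rule
x_hat = proximal_subgradient_splitting(subgrad_f, prox_g, np.zeros(5), stepsizes)

Taking g as the indicator function of a closed convex set turns the proximal step into a projection, which recovers the classical projected subgradient iteration the abstract mentions.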


Similar articles

Proximal point algorithms for nonsmooth convex optimization with fixed point constraints

This paper considers the problem of minimizing the sum of nonsmooth convex objective functions, defined on a real Hilbert space, over the intersection of the fixed point sets of nonexpansive mappings onto which projections cannot be efficiently computed. The use of proximal point algorithms that employ the proximity operators of the objective functions together with incremental optimization techniques is proposed...
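Purely as an illustration of the kind of iteration described above, the following sketch interleaves incremental proximal steps on the objective terms with an application of a nonexpansive mapping whose fixed point set encodes the constraint. The interleaving order, the stepsize rule, and the one-dimensional toy example are assumptions, not the algorithm analyzed in that paper.

import numpy as np

def incremental_prox_fixed_point(proxes, T, x0, stepsizes):
    # proxes:    list of functions prox_i(x, lam) = prox_{lam*f_i}(x)
    # T:         a nonexpansive mapping whose fixed point set is the constraint set
    # stepsizes: iterable of proximal parameters lam_k
    x = np.asarray(x0, dtype=float)
    for lam in stepsizes:
        for prox in proxes:      # incremental pass over the objective terms
            x = prox(x, lam)
        x = T(x)                 # push the iterate toward Fix(T)
    return x

# Assumed toy instance: minimize |x - 2| + |x - 3| over Fix(T) = [0, 1], with T the projection onto [0, 1].
prox_abs = lambda c: (lambda x, lam: c + np.sign(x - c) * np.maximum(np.abs(x - c) - lam, 0.0))
proxes = [prox_abs(2.0), prox_abs(3.0)]          # prox of lam*|x - c| is soft-thresholding toward c
T = lambda x: np.clip(x, 0.0, 1.0)               # projection onto a closed convex set is nonexpansive
x_star = incremental_prox_fixed_point(proxes, T, np.array([5.0]),
                                      (1.0 / (k + 1) for k in range(300)))   # approaches x* = 1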


Parallel Subgradient Method for Nonsmooth Convex Optimization with a Simple Constraint

In this paper, we consider the problem of minimizing the sum of nondifferentiable convex functions over a closed convex set in a real Hilbert space that is simple in the sense that the projection onto it can be easily computed. We present a parallel subgradient method for solving it, together with two convergence analyses of the method. One analysis shows that the parallel method with a small con...
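As a rough illustration of a parallel subgradient step of this type, the sketch below lets each component function contribute an independent projected subgradient update and averages the results. The averaging form, the stepsize rule, and the interval constraint are assumptions made for illustration, not the method analyzed in that paper.

import numpy as np

def parallel_subgradient_step(x, subgrads, project, alpha):
    # Each component f_i contributes its own projected subgradient update,
    # computed independently (hence "parallel"), and the results are averaged.
    updates = [project(x - alpha * g(x)) for g in subgrads]
    return sum(updates) / len(updates)

# Assumed toy instance: minimize |x - 2| + |x - 4| over the simple set C = [-1, 1].
subgrads = [lambda x: np.sign(x - 2.0), lambda x: np.sign(x - 4.0)]
project = lambda x: np.clip(x, -1.0, 1.0)     # projection onto C is easy to compute

x = np.array([0.0])
for k in range(300):
    x = parallel_subgradient_step(x, subgrads, project, alpha=1.0 / (k + 1))
# x approaches the constrained minimizer x* = 1.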


Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems

In this paper, we introduce a stochastic projected subgradient method for weakly convex (i.e., uniformly prox-regular) nonsmooth, nonconvex functions, a wide class of functions that includes the additive and convex composite classes. At a high level, the method is an inexact proximal point iteration in which the strongly convex proximal subproblems are quickly solved with a specialized stochast...
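To make the "inexact proximal point with stochastic subgradient inner solves" idea concrete, here is a heavily simplified sketch: each outer step defines a strongly convex proximal subproblem around the current center, which is then approximately minimized by a short stochastic subgradient run with weighted averaging. The stepsizes, the averaging weights, and the toy objective are generic assumptions, not the scheme analyzed in that paper.

import numpy as np

def proximally_guided_subgradient(stoch_subgrad, x0, rho, n_outer, n_inner):
    # Outer loop: inexact proximal point steps on the (possibly nonconvex) objective.
    # Inner loop: stochastic subgradient on the rho-strongly convex subproblem
    #             y -> f(y) + (rho/2) * ||y - center||^2, with weighted averaging.
    x = np.asarray(x0, dtype=float)
    for _ in range(n_outer):
        center, y, y_avg = x.copy(), x.copy(), np.zeros_like(x)
        for j in range(n_inner):
            g = stoch_subgrad(y) + rho * (y - center)                # stochastic subgradient of the subproblem
            y = y - (2.0 / (rho * (j + 2))) * g                      # stepsize for strongly convex problems (assumed)
            y_avg += 2.0 * (j + 1) / (n_inner * (n_inner + 1)) * y   # averaging weights sum to 1
        x = y_avg                                                    # approximate proximal point
    return x

# Assumed toy instance: f(x) = E|x - xi| with xi ~ N(1, 1); a minimizer is the median, x* = 1.
rng = np.random.default_rng(0)
stoch_subgrad = lambda x: np.sign(x - (1.0 + rng.standard_normal(x.shape)))
x_hat = proximally_guided_subgradient(stoch_subgrad, np.array([5.0]), rho=1.0, n_outer=30, n_inner=50)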


A proximal cutting plane method using Chebychev center for nonsmooth convex optimization

An algorithm is developed for minimizing nonsmooth convex functions. This algorithm extends the Elzinga-Moore cutting plane algorithm by forcing the search for the next test point to stay not too far from the previous ones, thus removing the compactness assumption. Our method is to Elzinga-Moore’s algorithm what a proximal bundle method is to Kelley’s algorithm. As in proximal bundle methods, a quadratic pr...
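The central computation in an Elzinga-Moore-type method is the Chebychev center of the current localization polyhedron, which is itself a small linear program. The sketch below shows only that subproblem, with the polyhedron data assumed given; the cutting-plane management and the proximal modification described in the abstract are omitted.

import numpy as np
from scipy.optimize import linprog

def chebychev_center(A, b):
    # Chebychev center of {x : A x <= b}: the center of the largest inscribed ball,
    # obtained by maximizing r subject to  a_i . x + ||a_i|| * r <= b_i  for every row a_i.
    norms = np.linalg.norm(A, axis=1, keepdims=True)
    c = np.zeros(A.shape[1] + 1)
    c[-1] = -1.0                                    # linprog minimizes, so maximize r via -r
    bounds = [(None, None)] * A.shape[1] + [(0.0, None)]
    res = linprog(c, A_ub=np.hstack([A, norms]), b_ub=b, bounds=bounds)
    return res.x[:-1], res.x[-1]                    # center, radius

# Illustration: the unit box [-1, 1]^2 has Chebychev center (0, 0) and radius 1.
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.array([1.0, 1.0, 1.0, 1.0])
center, radius = chebychev_center(A, b)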


A doubly stabilized bundle method for nonsmooth convex optimization

We propose a bundle method for minimizing nonsmooth convex functions that combines both the level and the proximal stabilizations. Most bundle algorithms use a cutting-plane model of the objective function to formulate a subproblem whose solution gives the next iterate. Proximal bundle methods employ the model in the objective function of the subproblem, while level methods put the model in the...
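A rough way to picture a doubly stabilized subproblem is as the usual proximal bundle subproblem with an extra level constraint on the cutting-plane model. The sketch below writes one such subproblem in epigraph form and hands it to a general-purpose solver; a real implementation would use a dedicated QP solver, and the bundle format, parameters, and solver choice here are assumptions rather than the paper's algorithm.

import numpy as np
from scipy.optimize import minimize

def cut(x, xi, fi, gi):
    return fi + gi @ (x - xi)          # affine cutting plane built at the point xi

def doubly_stabilized_step(bundle, center, t, level):
    # Variables z = (x, r): minimize r + ||x - center||^2 / (2t)   (proximal stabilization)
    # subject to r >= every cut (r plays the role of the cutting-plane model value)
    # and        r <= level                                        (level stabilization)
    n = len(center)
    fun = lambda z: z[-1] + np.sum((z[:n] - center) ** 2) / (2.0 * t)
    cons = [{"type": "ineq",
             "fun": lambda z, xi=xi, fi=fi, gi=gi: z[-1] - cut(z[:n], xi, fi, gi)}
            for xi, fi, gi in bundle]
    cons.append({"type": "ineq", "fun": lambda z: level - z[-1]})
    z0 = np.append(center, level)      # starting guess (SLSQP does not require a feasible start)
    res = minimize(fun, z0, method="SLSQP", constraints=cons)
    return res.x[:n]

# Illustration: two cuts of f(x) = |x| taken at x = 1 and x = -1, proximal center at 1.5, level 0.5.
bundle = [(np.array([1.0]), 1.0, np.array([1.0])), (np.array([-1.0]), 1.0, np.array([-1.0]))]
x_next = doubly_stabilized_step(bundle, center=np.array([1.5]), t=1.0, level=0.5)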



Journal title:

Volume   Issue

Pages  -

Publication date: 2014